Guaranteed Non-convex Optimization: Submodular Maximization over Continuous Domains
Abstract
Submodular continuous functions are a category of (generally) non-convex/non-concave functions with a wide spectrum of applications. We characterize these functions and demonstrate that they can be maximized efficiently with approximation guarantees. Specifically, I) for monotone submodular continuous functions with an additional diminishing-returns property, we propose a Frank-Wolfe-style algorithm with a (1 − 1/e)-approximation guarantee and a sub-linear convergence rate; II) for general non-monotone submodular continuous functions, we propose a DoubleGreedy algorithm with a 1/3-approximation guarantee. Submodular continuous functions naturally find applications in various real-world settings, including influence and revenue maximization with continuous assignments, sensor energy management, multi-resolution data summarization, and facility location. Experimental results show that the proposed algorithms efficiently generate solutions that are superior to those of baseline algorithms in terms of the empirical objective values.
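For intuition, a minimal sketch of a Frank-Wolfe-style (continuous greedy) update of the kind described above is shown below. The names frank_wolfe_submodular, grad_f, and lmo, the fixed step size, and the toy box constraint in the usage comments are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def frank_wolfe_submodular(grad_f, lmo, x0, num_iters=100):
    """Frank-Wolfe-style sketch for monotone DR-submodular maximization.

    grad_f : callable returning a gradient (or stochastic estimate) of f at x.
    lmo    : linear maximization oracle; given a gradient g, returns
             argmax over feasible v of <v, g>.
    x0     : feasible starting point (typically the all-zeros vector).
    """
    x = np.asarray(x0, dtype=float)
    step = 1.0 / num_iters           # fixed step size, steps sum to 1
    for _ in range(num_iters):
        g = grad_f(x)                # gradient at the current iterate
        v = lmo(g)                   # best feasible direction w.r.t. g
        x = x + step * v             # move along v; coordinates never shrink
    return x

# Illustrative usage on a toy box constraint {x : 0 <= x <= u} (assumption):
# u = np.ones(5)
# grad_f = lambda x: 1.0 / (2.0 * np.sqrt(x + 1.0))   # gradient of sum(sqrt(x + 1))
# lmo = lambda g: u * (g > 0)                         # linear maximization over the box
# x_hat = frank_wolfe_submodular(grad_f, lmo, np.zeros(5))
```

With the step size 1/K and a down-closed feasible set containing the origin, the final iterate is a convex-combination-style average of feasible directions and therefore remains feasible; this is the basic mechanism behind continuous-greedy-type guarantees.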
Similar resources
Non-monotone Continuous DR-submodular Maximization: Structure and Algorithms
DR-submodular continuous functions are important objectives with wide real-world applications spanning MAP inference in determinantal point processes (DPPs), and mean-field inference for probabilistic submodular models, amongst others. DR-submodularity captures a subclass of non-convex functions that enables both exact minimization and approximate maximization in polynomial time. In this work w...
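For reference, the diminishing-returns (DR) property referred to in this snippet is commonly stated coordinate-wise (a standard definition, not taken from the truncated text):

\[
f(x + k\,e_i) - f(x) \;\ge\; f(y + k\,e_i) - f(y)
\quad \text{for all } x \le y,\ k > 0,\ \text{and coordinates } i,
\]

whenever both perturbed points remain in the domain; for twice-differentiable f this is equivalent to every entry of the Hessian being non-positive.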
Projection-Free Online Optimization with Stochastic Gradient: From Convexity to Submodularity
Online optimization has been a successful framework for solving large-scale problems under computational constraints and partial information. Current methods for online convex optimization require either a projection or exact gradient computation at each step, both of which can be prohibitively expensive for large-scale applications. At the same time, there is a growing trend of non-convex opti...
Online Continuous Submodular Maximization
In this paper, we consider an online optimization process, where the objective functions are not convex (nor concave) but instead belong to a broad class of continuous submodular functions. We first propose a variant of the Frank-Wolfe algorithm that has access to the full gradient of the objective functions. We show that it achieves a regret bound of O(√T) (where T is the horizon of the onl...
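One common way to formalize the guarantee in this online setting (stated here for context, since the snippet is truncated) is the (1 − 1/e)-regret against the best fixed feasible point:

\[
\mathcal{R}_T \;=\; \left(1 - \tfrac{1}{e}\right)\max_{x \in \mathcal{K}} \sum_{t=1}^{T} f_t(x) \;-\; \sum_{t=1}^{T} f_t(x_t),
\]

so an O(√T) bound means the average per-round gap to the offline (1 − 1/e)-approximation benchmark vanishes as T grows.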
Submodular Functions: from Discrete to Continuous Domains
Submodular set-functions have many applications in combinatorial optimization, as they can be minimized and approximately maximized in polynomial time. A key element in many of the algorithms and analyses is the possibility of extending the submodular set-function to a convex function, which opens up tools from convex optimization. Submodularity goes beyond set-functions and has naturally been ...
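The convex extension mentioned in this snippet is typically the Lovász extension. A minimal sketch of how it can be evaluated is given below, assuming a set function F with F(∅) = 0; the names lovasz_extension, F, and w are illustrative.

```python
import numpy as np

def lovasz_extension(F, w):
    """Evaluate the Lovasz extension of a set function F at w in [0, 1]^n.

    F : callable taking a Python set of indices and returning a real value,
        with F(set()) assumed to equal 0.
    w : 1-D array of n coordinates in [0, 1].
    """
    w = np.asarray(w, dtype=float)
    order = np.argsort(-w)                   # coordinates in decreasing order
    value, prev_F, chosen = 0.0, 0.0, set()
    for idx in order:
        chosen.add(int(idx))
        cur_F = F(chosen)
        value += w[idx] * (cur_F - prev_F)   # weight each marginal gain by w
        prev_F = cur_F
    return value
```

For submodular F this extension is convex, which is what lets tools from convex optimization be applied to the discrete minimization problem.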
Risk-Sensitive Submodular Optimization
The conditional value at risk (CVaR) is a popular risk measure which enables risk-averse decision making under uncertainty. We consider maximizing the CVaR of a continuous submodular function, an extension of submodular set functions to a continuous domain. One example application is allocating a continuous amount of energy to each sensor in a network, with the goal of detecting intrusion or co...
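For context, the CVaR of a random reward Z at risk level α ∈ (0, 1] is often written via the Rockafellar–Uryasev variational form (a standard formulation, not quoted from the truncated snippet):

\[
\mathrm{CVaR}_{\alpha}(Z) \;=\; \max_{\tau \in \mathbb{R}} \left\{ \tau - \frac{1}{\alpha}\, \mathbb{E}\bigl[(\tau - Z)_{+}\bigr] \right\},
\]

which roughly averages the worst α-fraction of outcomes of Z, so maximizing it yields risk-averse allocations.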